Regression Analysis - 07

Variance of the intercept, \(\Large \displaystyle \hat{a_0} \)

Estimating the variance of \(\Large \displaystyle \hat{a_0} \) makes use of the estimated variance of \(\Large \displaystyle \hat{a_1} \).

From \(\Large \displaystyle \hat{a_0} = \bar{Y} - \hat{a_1} \bar{X} \) and \(\Large \displaystyle \bar{Y} = a_0 + a_1 \bar{X} + \bar{u} \), we obtain

\(\Large \displaystyle \hat{a_0} = a_0 - \left( \hat{a_1} - a_1 \right) \bar{X} + \bar{u} \)

and so its variance is

\(\Large \displaystyle V \left[ \hat{a_0} \right] = E \left[ \hat{a_0} - a_0 \right]^2 = E \left[ - \left( \hat{a_1} - a_1 \right) \bar{X} + \bar{u} \right]^2 \)

\(\Large \displaystyle = \bar{X}^2 E \left[ \hat{a_1} - a_1 \right]^2 -2 \bar{X} E \left[ \left( \hat{a_1} - a_1 \right) \bar{u} \right] + E \left[ \bar{u}^2 \right] \)

Let us consider each term in turn.
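Before doing so, the decomposition of \(\Large \displaystyle \hat{a_0} \) above can be sanity-checked numerically. The sketch below is only an illustration, and everything in it (numpy, the sample size, the true parameters, the uniform design for \(\Large \displaystyle X \)) is my own assumption, not part of the text: it fits the least-squares line by hand on one simulated sample and confirms that both sides of the decomposition agree.

import numpy as np

rng = np.random.default_rng(0)
n, a0, a1, sigma = 50, 1.0, 2.0, 0.5          # assumed true parameters

X = rng.uniform(0.0, 10.0, size=n)
u = rng.normal(0.0, sigma, size=n)
Y = a0 + a1 * X + u

Xbar, Ybar, ubar = X.mean(), Y.mean(), u.mean()
a1_hat = np.sum((X - Xbar) * (Y - Ybar)) / np.sum((X - Xbar) ** 2)
a0_hat = Ybar - a1_hat * Xbar

print(a0_hat)                                 # left-hand side
print(a0 - (a1_hat - a1) * Xbar + ubar)       # right-hand side of the decomposition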

・First term

Consider \(\Large \displaystyle \bar{X}^2 E \left[ \hat{a_1} - a_1 \right]^2 \).

\(\Large \displaystyle V \left[ \hat{a_1} \right] = E \left[ \left( \hat{a_1} -\bar{ \hat{ a_1}} \right)^2 \right] =E \left[ \left( \hat{a_1} - a_1 \right)^2 \right] \)

(the second equality uses \(\Large \displaystyle E \left[ \hat{a_1} \right] = a_1 \), shown two pages back), and therefore

\(\Large \displaystyle \bar{X}^2 E \left[ \hat{a_1} - a_1 \right]^2 = \bar{X}^2 V \left[ \hat{a_1} \right] = \bar{X}^2 \frac{ \sigma^2 }{ \displaystyle \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \)
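This first term can also be illustrated numerically (a sketch under my own assumptions: numpy, a fixed uniform design, \(\Large \displaystyle \sigma = 0.5 \), and 20,000 replications). The empirical variance of \(\Large \displaystyle \hat{a_1} \) over repeated samples, scaled by \(\Large \displaystyle \bar{X}^2 \), should match the expression above.

import numpy as np

rng = np.random.default_rng(1)
n, a0, a1, sigma, reps = 30, 1.0, 2.0, 0.5, 20000

X = rng.uniform(0.0, 10.0, size=n)     # design held fixed across replications
Xbar = X.mean()
Sxx = np.sum((X - Xbar) ** 2)

a1_hats = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=n)
    Y = a0 + a1 * X + u
    a1_hats[r] = np.sum((X - Xbar) * (Y - Y.mean())) / Sxx

print(Xbar ** 2 * a1_hats.var())       # empirical  X̄² V[â1]
print(Xbar ** 2 * sigma ** 2 / Sxx)    # value from the formula above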

 

・Second term

\(\Large \displaystyle \hat{a_1} = a_1 + \sum_{i=1}^{n} \omega_i u_i = a_1 + \frac{ \displaystyle \sum_{i=1}^{n} \left(X_i - \bar{X} \right) u_i }{ \displaystyle \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \)

Hence,

\(\Large \begin{eqnarray} \displaystyle E \left[ \left( \hat{a_1} - a_1 \right) \bar{u} \right] &=& E \left[ \frac{\sum_{i=1}^{n} \left(X_i - \bar{X} \right)u_i }{\sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \frac{1}{n} \sum_{i=1}^n u_i \right] \\
&=& \frac{1 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2}E \left[ \sum_{i=1}^{n} \left(X_i - \bar{X} \right)u_i \sum_{i=1}^n u_i \right] \\
&=& \frac{1 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \sum_{i=1}^{n} \sum_{j=1}^{n} \left(X_i - \bar{X} \right) E \left[ u_i u_j \right] \\
&=& \frac{\sigma^2 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \sum_{i=1}^{n} \left(X_i - \bar{X} \right) \\
\end{eqnarray} \)

Here,

\(\Large \displaystyle \sum_{i=1}^{n} \left( X_i - \bar{X} \right) = \sum_{i=1}^{n} X_i - \sum_{i=1}^{n} \bar{X} = n \bar{X} - n \bar{X} = 0 \)

so, in the end,

\(\Large \displaystyle E \left[ \left( \hat{a_1} - a_1 \right) \bar{u} \right] = 0 \)

and the second term vanishes.
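Both facts used here, the exact identity \(\Large \displaystyle \sum_{i=1}^{n} \left( X_i - \bar{X} \right) = 0 \) and the vanishing cross moment, can be checked with a short simulation. This is only a sketch; numpy and all the settings below are my assumptions.

import numpy as np

rng = np.random.default_rng(2)
n, sigma, reps = 30, 0.5, 20000

X = rng.uniform(0.0, 10.0, size=n)
Xbar = X.mean()
Sxx = np.sum((X - Xbar) ** 2)

print(np.sum(X - Xbar))                     # exact identity: zero up to rounding

cross = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=n)
    a1_dev = np.sum((X - Xbar) * u) / Sxx   # â1 - a1, written directly in terms of u
    cross[r] = a1_dev * u.mean()
print(cross.mean())                         # ≈ 0, matching E[(â1 - a1) ū] = 0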

 

・Third term

\(\Large \displaystyle E \left[ \bar{u}^2 \right] = E \left[ \frac{1}{n} \sum_{i=1}^n u_i \right]^2 = \frac{1}{n^2} E \left[ \sum_{i=1}^n \sum_{j=1}^n u_i u_j \right]
= \frac{1}{n^2} E \left[ \sum_{i=1}^n u_i^2 \right] \)

\(\Large \displaystyle = \frac{1}{n^2} \sum_{i=1}^n E \left[ u_i^2 \right] = \frac{1}{n^2} \sum_{i=1}^n \sigma^2 = \frac{1}{n} \sigma^2 \)

so the third term equals \(\Large \displaystyle \sigma^2 / n \). (The reduction of the double sum to a single sum above uses \(\Large \displaystyle E \left[ u_i u_j \right] = 0 \) for \( i \neq j \).)
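This, too, can be confirmed with a quick Monte Carlo check (a sketch; numpy, the error scale, the sample size, and the replication count are my choices).

import numpy as np

rng = np.random.default_rng(3)
n, sigma, reps = 30, 0.5, 200000

ubar = rng.normal(0.0, sigma, size=(reps, n)).mean(axis=1)   # one ū per replication
print(np.mean(ubar ** 2))   # empirical E[ū²]
print(sigma ** 2 / n)       # theoretical σ²/n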

 

Therefore,

\(\Large \begin{eqnarray} \displaystyle V \left[ \hat{a_0} \right] &=& \bar{X}^2 \frac{\sigma^2 }{\sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} + \frac{\sigma^2}{n} \\
&=& \sigma^2 \frac{n \bar{X}^2 + \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \\
&=& \sigma^2 \frac{n \bar{X}^2 + \sum_{i=1}^{n} \left( X_i^2 - 2 \bar{X} X_i + \bar{X}^2 \right) }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \\
&=& \sigma^2 \frac{ \sum_{i=1}^{n} X_i^2 - 2 \bar{X} \sum_{i=1}^{n} X_i + 2 n\bar{X}^2 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \\
&=& \sigma^2 \frac{ \sum_{i=1}^{n} X_i^2 - 2 \bar{X} n \bar{X}+ 2 n\bar{X}^2 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \\
&=& \sigma^2 \frac{ \sum_{i=1}^{n} X_i^2 }{n \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2} \\
\end{eqnarray} \)

This is the variance of \(\Large \displaystyle \hat{a_0} \).

To summarize,

\(\Large \displaystyle \color{red}{V \left[\hat{a_1} \right] = \frac{\sigma^2 }{ \displaystyle \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2}}\)

\(\Large \displaystyle \color{red}{V \left[ \hat{a_0} \right] = \sigma^2 \frac{ \displaystyle \sum_{i=1}^{n} X_i^2 }{n \displaystyle \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2}} \)

These are the variances of the two estimators.
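The two boxed formulas can be verified together by simulation. In the sketch below (numpy and every setting in it are my assumptions, not part of the text), the design is held fixed across replications, matching the fixed-regressor setting of the derivation, and the empirical variances of \(\Large \displaystyle \hat{a_1} \) and \(\Large \displaystyle \hat{a_0} \) are compared with the formulas.

import numpy as np

rng = np.random.default_rng(4)
n, a0, a1, sigma, reps = 30, 1.0, 2.0, 0.5, 20000

X = rng.uniform(0.0, 10.0, size=n)     # fixed design, as assumed in the derivation
Xbar = X.mean()
Sxx = np.sum((X - Xbar) ** 2)

a0_hats = np.empty(reps)
a1_hats = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=n)
    Y = a0 + a1 * X + u
    a1_hats[r] = np.sum((X - Xbar) * (Y - Y.mean())) / Sxx
    a0_hats[r] = Y.mean() - a1_hats[r] * Xbar

print(a1_hats.var(), sigma ** 2 / Sxx)                            # V[â1]
print(a0_hats.var(), sigma ** 2 * np.sum(X ** 2) / (n * Sxx))     # V[â0]

The printed pairs should agree up to Monte Carlo error.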

That might seem to settle the matter, but the formulas still contain

\(\Large \displaystyle \color{red}{ \sigma^2} \)

How should we handle this quantity?

We will look into this on the next page.

 
